
    A tight security reduction in the quantum random oracle model for code-based signature schemes

    Quantum secure signature schemes have received a lot of attention recently, in particular because of the NIST call to standardize quantum-safe cryptography. However, only a few signature schemes achieve concrete quantum security, because of technical difficulties associated with the Quantum Random Oracle Model (QROM). In this paper, we show that code-based signature schemes based on the full domain hash paradigm behave very well in the QROM, i.e. that they admit tight security reductions. We also study quantum algorithms related to the underlying code-based assumption. Finally, we apply our reduction to a concrete example: the SURF signature scheme. We provide parameters for 128 bits of quantum security in the QROM and show that the obtained parameters are competitive compared to other similar quantum-secure signature schemes.
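
    As an illustration of the full domain hash ("hash-and-sign") paradigm that the reduction applies to, here is a minimal Python sketch: the message is hashed to a syndrome, and a low-weight error vector mapping to it plays the role of the signature. The tiny parameters and the brute-force inversion are illustrative assumptions standing in for a real trapdoor; this is not the SURF algorithm itself.

```python
import hashlib
import itertools
import os
import random

# Toy sketch of the full-domain-hash ("hash-and-sign") paradigm for a
# code-based signature.  A real scheme replaces the brute-force search
# below with an efficient secret trapdoor inversion; all parameters here
# are illustrative assumptions.

N, K, W = 12, 6, 3          # toy code length, dimension, signature weight

def hash_to_syndrome(msg: bytes, salt: bytes, r: int) -> tuple:
    """Hash (salt, msg) to a length-r binary syndrome (full domain hash)."""
    digest = hashlib.shake_256(salt + msg).digest((r + 7) // 8)
    bits = [(b >> i) & 1 for b in digest for i in range(8)]
    return tuple(bits[:r])

def syndrome(H, e):
    """Compute H e^T over F_2."""
    return tuple(sum(H[i][j] * e[j] for j in range(len(e))) % 2
                 for i in range(len(H)))

def sign(msg: bytes, H):
    """FDH signing: hash to a syndrome, then invert the syndrome map on a
    low-weight error (brute force stands in for the secret trapdoor)."""
    while True:
        salt = os.urandom(8)
        s = hash_to_syndrome(msg, salt, N - K)
        for support in itertools.combinations(range(N), W):
            e = [1 if j in support else 0 for j in range(N)]
            if syndrome(H, e) == s:
                return salt, e
        # not every syndrome is reachable at weight W: re-salt and retry

def verify(msg: bytes, sig, H) -> bool:
    salt, e = sig
    return sum(e) <= W and syndrome(H, e) == hash_to_syndrome(msg, salt, N - K)

random.seed(1)
H = [[random.randint(0, 1) for _ in range(N)] for _ in range(N - K)]
print(verify(b"hello", sign(b"hello", H), H))   # True
```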

    The problem with the SURF scheme

    There is a serious problem with one of the assumptions made in the security proof of the SURF scheme: the problem assumed to be hard turns out to be easy to solve in the regime of parameters needed for the SURF scheme to work. We explain this problem here and give the old version of the paper afterwards for the reader's convenience.

    Worst and Average Case Hardness of Decoding via Smoothing Bounds

    In this work, we consider the worst-case and average-case hardness of the decoding problems that are the basis for code-based cryptography. By a decoding problem, we consider inputs of the form $(\mathbf{G}, \mathbf{m}\mathbf{G} + \mathbf{t})$ for a matrix $\mathbf{G}$ that generates a code and a noise vector $\mathbf{t}$, and the algorithm's goal is to recover $\mathbf{m}$. We consider a natural strategy for creating a reduction to an average-case problem: from our input we simulate a Learning Parity with Noise (LPN) oracle, where we recall that LPN is essentially an average-case decoding problem with no a priori lower bound on the rate of the code. More formally, the oracle $\mathcal{O}_{\mathbf{x}}$ outputs independent samples of the form $\langle \mathbf{x}, \mathbf{a} \rangle + e$, where $\mathbf{a}$ is a uniformly random vector and $e$ is a noise bit. Such an approach is (implicit in) the previous worst-case to average-case reductions for coding problems (Brakerski et al., Eurocrypt 2019; Yu and Zhang, CRYPTO 2021). To analyze the effectiveness of this reduction, we use a smoothing bound derived recently by Debris-Alazard et al. (IACR ePrint 2022), which quantifies the simulation error of this reduction. It is worth noting that this latter work crucially uses a bound, known as the second linear programming bound, on the weight distribution of the code generated here by $\mathbf{G}$. Our approach, which is Fourier-analytic in nature, applies to any smoothing distribution (so long as it is radial); for our purposes, the best choice appears to be Bernoulli (although for the analysis it is most effective to study the uniform distribution over a sphere, and subsequently translate the bound back to the Bernoulli distribution by applying a truncation trick). Our approach works naturally when reducing from a worst-case instance, as well as from an average-case instance. While we are unable to improve the parameters of the worst-case to average-case reductions of Brakerski et al. or Yu and Zhang, we think that our work highlights two important points. Firstly, in analyzing the average-case to average-case reduction we run into inherent limitations of this reduction template: essentially, it appears hopeless to reduce to an LPN instance for which the noise rate is more than inverse-polynomially biased away from uniform. Secondly, we uncover a surprising weakness in the second linear programming bound: it is essentially useless in the regime of parameters where the rate of the code is inverse polynomial in the block length. By highlighting these shortcomings, we hope to stimulate the development of new techniques for reductions between cryptographic decoding problems.
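
    The oracle simulation at the heart of this reduction template can be sketched in a few lines: hitting both the generator matrix and the noisy codeword with a fresh Bernoulli vector f yields one sample (a, b) with b = <m, a> + <t, f>, i.e. an LPN sample with secret m whose noise bit is <t, f>. The Python sketch below uses toy parameters of our choosing; the Bernoulli rate tau is an illustrative assumption, not the value analyzed in the paper.

```python
import random

# One simulated LPN sample from a decoding instance (G, y = mG + t):
# sample f ~ Bernoulli(tau)^n, output a = G f^T and b = <y, f>.
# Then b = <m, a> + <t, f>, so this is an LPN sample with secret m,
# provided a is close to uniform -- exactly what the smoothing bound
# quantifies.  All parameters below are illustrative assumptions.

def bernoulli_vector(n, tau):
    return [1 if random.random() < tau else 0 for _ in range(n)]

def dot(u, v):
    return sum(a * b for a, b in zip(u, v)) % 2

def lpn_sample(G, y, tau):
    n = len(y)
    f = bernoulli_vector(n, tau)
    a = [dot(row, f) for row in G]   # a = G f^T, one entry per row of G
    b = dot(y, f)                    # b = <y, f> = <m, a> + <t, f>
    return a, b

# Toy demo: build an instance and check the identity b - <m, a> = <t, f>.
random.seed(0)
n, k, tau = 64, 16, 0.05
G = [[random.randint(0, 1) for _ in range(n)] for _ in range(k)]
m = [random.randint(0, 1) for _ in range(k)]
t = [1 if random.random() < 0.02 else 0 for _ in range(n)]
y = [(dot(m, [G[i][j] for i in range(k)]) + t[j]) % 2 for j in range(n)]
a, b = lpn_sample(G, y, tau)
print((b - dot(m, a)) % 2)           # the simulated noise bit <t, f>
```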

    Statistical Decoding

    The security of code-based cryptography relies primarily on the hardness of generic decoding of linear codes. The best generic decoding algorithms are all improvements of an old algorithm due to Prange: they are known under the name of information set decoding (ISD) techniques. A while ago, a generic decoding algorithm that does not belong to this family was proposed: statistical decoding. It is a randomized algorithm that requires the computation of a large set of parity-check equations of moderate weight. We solve here several open problems related to this decoding algorithm. In particular, we give its asymptotic complexity, give a rather efficient way, inspired by ISD techniques, of computing the parity-check equations it needs, and give a lower bound on its complexity showing that, when it comes to decoding on the Gilbert-Varshamov bound, it can never be better than Prange's algorithm.
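
    The voting idea at the core of statistical decoding can be sketched compactly: every parity check h satisfies <h, c> = 0 for codewords c, so <h, y> = <h, e> is a biased vote on the error bit at any position where h is nonzero. The sketch below is didactic Python of our own, not the optimized algorithm analyzed in the paper; the demo uses the [7,4] Hamming code, whose dual codewords serve as moderate-weight checks.

```python
# Didactic sketch of statistical decoding: for each position i, parity
# checks h with h_i = 1 vote on the error bit e_i via <h, y> = <h, e>,
# and a majority vote recovers e_i when enough checks are available.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v)) % 2

def statistical_decode_bit(i, y, checks):
    """Majority vote for error bit e_i from parity checks with h_i = 1."""
    votes = [dot(h, y) for h in checks if h[i] == 1]
    if not votes:
        return None                       # no usable check for position i
    return 1 if sum(votes) > len(votes) / 2 else 0

# Demo: the weight-4 dual codewords of the [7,4] Hamming code as checks,
# decoding a single-bit error on top of the zero codeword.
H = [[1, 0, 1, 0, 1, 0, 1],
     [0, 1, 1, 0, 0, 1, 1],
     [0, 0, 0, 1, 1, 1, 1]]
checks = []
for mask in range(1, 8):                  # all nonzero row combinations
    h = [0] * 7
    for r in range(3):
        if (mask >> r) & 1:
            h = [(a + b) % 2 for a, b in zip(h, H[r])]
    checks.append(h)
y = [0, 0, 0, 0, 1, 0, 0]                 # zero codeword plus one error bit
print([statistical_decode_bit(i, y, checks) for i in range(7)])  # recovers e
```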

    Pseudorandomness of Decoding, Revisited: Adapting OHCP to Code-Based Cryptography

    Recent code-based cryptosystems rely, among other things, on the hardness of the decisional decoding problem. While the search version is well understood, both from practical and theoretical standpoints, the decision version has been less studied in the literature, and little is known about its relationship to the search version, especially for structured variants. In the world of Euclidean lattices, on the other hand, the situation is rather different, and many reductions exist, both for unstructured and structured versions of the underlying problems. For the latter versions, a powerful and very general tool called the OHCP framework (for Oracle with Hidden Center Problem) was introduced by Peikert et al. (STOC 2017) and has proved very useful as a black box inside reductions. In this work, we revisit this technique and extract its very essence, namely the Oracle Comparison Problem (OCP), to show how to recover the support of the error, solving an Oracle with Hidden Support Problem (OHSP) that is more suitable for code-based cryptography. This yields a new worst-case to average-case search-to-decision reduction. We then turn to the structured versions and explain why this is not as straightforward as for Euclidean lattices. Although we fail to give a search-to-decision reduction for structured codes, we believe that our work opens the way towards new reductions for structured codes, given how powerful the OHCP framework proved to be in lattice-based cryptography. Furthermore, we believe that this technique could be extended to codes endowed with other metrics, such as the rank metric, for which no reduction is known.
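
    To convey the flavor of the oracle-comparison idea, here is a toy Python illustration: a noisy decision oracle accepts with a probability that depends on whether the probed coordinate lies in a hidden support, and comparing empirical acceptance rates against a reference recovers that support. The synthetic oracle and all its parameters are stand-ins of our own making, not the distinguisher actually built in the reduction.

```python
import random

# Synthetic illustration of recovering a hidden support by comparing
# oracle acceptance rates: coordinates inside the support make the
# oracle accept noticeably more often than the reference rate.

def make_oracle(hidden_support, p_in=0.7, p_out=0.5):
    def oracle(i):
        p = p_in if i in hidden_support else p_out
        return random.random() < p
    return oracle

def recover_support(oracle, n, samples=2000, threshold=0.6):
    support = set()
    for i in range(n):
        rate = sum(oracle(i) for _ in range(samples)) / samples
        if rate > threshold:              # compare against the reference
            support.add(i)
    return support

random.seed(0)
S = {2, 5, 11}
print(recover_support(make_oracle(S), 16) == S)   # True
```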

    Wave: A New Code-Based Signature Scheme

    IACR preprint available at https://eprint.iacr.org/2018/996/20181022:154324. We present here Wave, the first "hash-and-sign" code-based signature scheme which strictly follows the GPV strategy [GPV08]. It uses the family of ternary generalized (U, U + V) codes. We prove that Wave achieves existential unforgeability under adaptive chosen message attacks (EUF-CMA) in the random oracle model (ROM) with a tight reduction to two assumptions from coding theory: one is a distinguishing problem related to the trapdoor we insert in our scheme, the other is DOOM, a multiple-target version of syndrome decoding. The algorithm produces uniformly distributed signatures through a suitable rejection sampling. Our scheme enjoys efficient signature and verification algorithms. For 128 bits of classical security, signatures are 8 thousand bits long and the public key size is slightly smaller than one megabyte. Furthermore, with our current choice of parameters, the rejection rate is limited to one rejection every 3 or 4 signatures.
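
    The rejection-sampling step that makes the signature distribution uniform can be illustrated generically: a biased "trapdoor" sampler is flattened to the target distribution by accepting a draw x with probability target(x) / (M * proposal(x)). The toy distributions in the Python sketch below are synthetic stand-ins, not Wave's actual ones.

```python
import random
from collections import Counter

# Generic rejection sampling: flatten a biased proposal distribution to a
# uniform target, the same pattern Wave uses so that signatures carry no
# trapdoor bias.  The distributions here are synthetic stand-ins.

OUTCOMES = [0, 1, 2, 3]
PROPOSAL = {0: 0.4, 1: 0.3, 2: 0.2, 3: 0.1}   # biased "trapdoor" output
TARGET = {x: 0.25 for x in OUTCOMES}          # uniform target
M = max(TARGET[x] / PROPOSAL[x] for x in OUTCOMES)

def biased_draw():
    return random.choices(OUTCOMES, weights=[PROPOSAL[x] for x in OUTCOMES])[0]

def uniform_draw():
    while True:                               # rejection loop
        x = biased_draw()
        if random.random() < TARGET[x] / (M * PROPOSAL[x]):
            return x

random.seed(0)
print(Counter(uniform_draw() for _ in range(10000)))  # roughly 2500 each
```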

    This is Not an Attack on Wave

    Very recently, a preprint "Cryptanalysis of the Wave Signature Scheme" (ePrint 2018/1111) appeared, claiming to break Wave ("Wave: A New Code-Based Signature Scheme", ePrint 2018/996). We explain here why this claim is incorrect.

    SURF: A new code-based signature scheme

    There is a serious problem with one of the assumptions made in the security proof of the SURF scheme: the problem assumed to be hard turns out to be easy to solve in the regime of parameters needed for the SURF scheme to work. We give the old version of the paper afterwards for the reader's convenience.

    About Wave Implementation and its Leakage Immunity

    Wave is a recent digital signature scheme [3]. It is based on a family of trapdoor one-way Preimage Sampleable Functions and is proven EUF-CMA in the random oracle model under two code-based computational assumptions. One of its key properties is to produce uniformly distributed signatures of fixed Hamming weight. This property implies that, if properly implemented, Wave is immune to leakage attacks. We describe here the key stages of the implementation of the Wave trapdoor inverse function, integrating all the features needed to achieve leakage-freeness. A proof-of-concept implementation was made in SageMath. It allowed us to check that properly generated Wave signatures are uniformly distributed. In particular, we show that the signatures produced by this implementation defeat the Barreto-Persichetti attack. We show which features of the Wave specification were improperly put aside and explain why the claim of breaking Wave is incorrect.

    Preliminary statement: we consider here the first version of Wave (v1 of [3]). This is the version which is allegedly broken in [1]. All the features that guarantee the absence of leakage are already there. It considers a strict (U, U + V) code (i.e. not a generalized one) and has no information set gap. The description of the decoder for generalized (U, U + V) codes only appeared in v2. The information set gap (denoted d) appeared in the third version of Wave; it was introduced to give a provably small upper bound on the statistical distance between the signature distribution and the uniform distribution. We conjecture that this statistical distance is negligible even with a zero gap. Our implementation is available at https://project.inria.fr/wave and includes the three versions.
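
    As a hint of what such a uniformity check can look like, here is a Python sketch that tests the per-coordinate nonzero frequencies of fixed-weight ternary words against their expected value w/n. The sampler below is a synthetic stand-in for actual Wave signatures, and the tolerance is an illustrative assumption.

```python
import math
import random

# Uniformity sanity check: if words are uniform over weight-w words in
# {0,1,2}^n, every coordinate is nonzero with probability w/n.  Flag any
# coordinate whose empirical frequency deviates by many standard errors.

def sample_fixed_weight_word(n, w):
    """Uniform word of Hamming weight w over {0, 1, 2}^n (toy sampler)."""
    word = [0] * n
    for i in random.sample(range(n), w):
        word[i] = random.choice([1, 2])
    return word

def coordinate_frequency_check(words, n, w, sigma_tol=4.0):
    """Return coordinates whose nonzero frequency deviates from w/n."""
    m = len(words)
    p = w / n
    sigma = math.sqrt(p * (1 - p) / m)    # standard error of the frequency
    freqs = [sum(1 for wd in words if wd[i] != 0) / m for i in range(n)]
    return [i for i, f in enumerate(freqs) if abs(f - p) > sigma_tol * sigma]

random.seed(0)
words = [sample_fixed_weight_word(64, 40) for _ in range(5000)]
print(coordinate_frequency_check(words, 64, 40))  # [] when nothing deviates
```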